
Impression Detection and Management Using an Embodied Conversational Agent / Wang, C.; Biancardi, B.; Mancini, M.; Cafaro, A.; Pelachaud, C.; Pun, T.; Chanel, G. - 12182:(2020), pp. 260-278. (Paper presented at the Thematic Area on Human Computer Interaction, HCI 2020, held as part of the 22nd International Conference on Human-Computer Interaction, HCII 2020, Copenhagen) [10.1007/978-3-030-49062-1_18].

Impression Detection and Management Using an Embodied Conversational Agent

Mancini M.;
2020

Abstract

Embodied Conversational Agents (ECAs) are a promising medium for human-computer interaction, since they can engage users in real-time face-to-face interaction [1, 2]. The impressions users form of an ECA (e.g. favour or dislike) can be reflected in their behaviour [3, 4]. These impressions may affect the interaction and could even persist afterwards [5, 7]. Thus, when building an ECA intended to impress users, it is important to detect how users feel about it; the impression the ECA leaves can then be adjusted by controlling its non-verbal behaviour [7]. Motivated by the role of ECAs in interpersonal interaction and the state of the art in affect recognition, we investigated three research questions: 1) which modality (facial expressions, eye movements, or physiological signals) reveals most about the formed impressions; 2) whether an ECA can leave a better impression by maximizing the impression it produces; 3) whether impression formation differs between human-human and human-agent interaction. First, our results showed the value of using different modalities to detect impressions: an ANOVA indicated that the facial-expression modality outperforms the physiological modality (M = 1.27, p = 0.02). Second, our results demonstrated the feasibility of an adaptive ECA: compared with randomly selected ECA behaviour, participants' ratings tended to be higher when the ECA adapted its behaviour based on the detected impressions. Third, we found similar behaviour in human-human and human-agent interaction: people treated the ECA much as they would a human, spending more time observing the face area when forming an impression.
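To illustrate the kind of modality comparison reported in the abstract, the sketch below runs a one-way ANOVA over per-participant detection scores for two modalities. All data, sample sizes, and score values here are invented for illustration; the paper's actual features, classifiers, and statistics are not reproduced.

```python
# Hedged sketch: comparing impression-detection performance across two
# modalities with a one-way ANOVA, analogous to the facial-expression vs.
# physiological comparison described in the abstract.
# All numbers below are invented for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant detection scores (e.g. classification accuracy)
facial_scores = rng.normal(loc=0.72, scale=0.05, size=20)  # facial expressions
physio_scores = rng.normal(loc=0.58, scale=0.05, size=20)  # physiological signals

# One-way ANOVA: does mean performance differ between modalities?
f_stat, p_value = stats.f_oneway(facial_scores, physio_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

With more than two modalities (e.g. adding eye movements), additional score arrays would simply be passed as further arguments to `f_oneway`, followed by pairwise post-hoc tests.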
2020
Thematic Area on Human Computer Interaction, HCI 2020, held as part of the 22nd International Conference on Human-Computer Interaction, HCII 2020
Affective computing; Eye gaze; Impression detection; Impression management; Machine learning; Reinforcement learning; Virtual agent
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this record
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1530537
Warning: the data displayed have not been validated by the university.

Citations
  • PMC: N/A
  • Scopus: 2
  • Web of Science (ISI): N/A